REF: Refactor PET model #204
Conversation
Force-pushed 2fb555e to 90eb886.
Had another look at this. I had introduced a
)
# ToDo
# Are timepoints and xlim features that all PET models require ??
@mnoergaard question for you.
I would say so, since we are dealing with motion correction and hence need temporal information. Currently, the PET model needs both the sampling midpoints and the scan's total duration to work, so enforcing their presence in the base class keeps the API consistent. When new PET models that do not need temporal information are introduced, we can revisit this requirement; until then, providing `dataset.midframe` and `dataset.total_duration` (as done in the unit tests) is the intended usage.
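For reference, here is a minimal sketch of what enforcing the temporal information in the base class could look like. Only `dataset.midframe` and `dataset.total_duration` are names taken from this thread; the class name, error message, and the `_x`/`_xlim` attributes mirror the notation used later in this review, but the code itself is illustrative, not the actual nifreeze implementation.

```python
# Illustrative sketch only, not the nifreeze code.
import numpy as np


class PETBaseModelSketch:
    """Base class that requires the scan's temporal information."""

    def __init__(self, dataset):
        midframe = getattr(dataset, "midframe", None)
        total_duration = getattr(dataset, "total_duration", None)
        if midframe is None or total_duration is None:
            raise ValueError(
                "PET models require temporal information: the dataset must "
                "provide `midframe` and `total_duration`."
            )
        self._dataset = dataset
        self._x = np.asarray(midframe, dtype="float32")  # sampling midpoints
        self._xlim = float(total_duration)               # total scan duration
```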
OK, I see that this is related to #204 (review). I think the naming should then be made consistent across the PET data class and the model class. Also, I do not see why the model is given the timepoints if these are held by the PET data class; i.e., from #204 (review):
> (...) midframe is where the dataset stores real-world frame timing; timepoints/_x is the copy of those timings handed to the model

then we should not be providing the model with a copy of them.
If we make the parallel with the DWI class, the gradients are obtained from the dataset, e.g.:
gradient = self._dataset.gradients[:, index]
Also, can the xlim name be made somehow more descriptive or is it a name that is commonly used within the PET domain?
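To illustrate the parallel being drawn, here is a hedged sketch of the dataset-driven access pattern; only `gradients` and `midframe` come from this thread, while the class and method names are made up:

```python
# Hypothetical sketch: the model queries the dataset instead of holding its
# own copy of the metadata, mirroring the DWI gradient access shown above.
class DWIModelSketch:
    def __init__(self, dataset):
        self._dataset = dataset

    def gradient(self, index):
        # DWI: the per-volume gradient is read from the dataset.
        return self._dataset.gradients[:, index]


class PETModelSketch:
    def __init__(self, dataset):
        self._dataset = dataset

    def frame_time(self, index):
        # PET analogue: the frame timing is read from the dataset as well.
        return self._dataset.midframe[index]
```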
src/nifreeze/model/pet.py (outdated)
raise NotImplementedError("Fitting with held-out data is not supported")
# ToDo
# Does not make sense to make timepoints be a kwarg if it is provided as a named parameter to __init__
@mnoergaard question for you.
src/nifreeze/model/pet.py (outdated)
# ToDo
# What is the gtab equivalent of PET ?
model_str = getattr(self, "_model_class", "")
This may not be necessary.
src/nifreeze/model/pet.py (outdated)
# ToDo
# What are the gtab (and S0 if any) equivalent of PET ?
if n_models == 1:
# ToDo
# Does not make sense to make timepoints be a kwarg if it is provided as a named parameter to __init__
timepoints = kwargs.get("timepoints", None) or self._x
# Does the below apply to PET ? Martin has the return None statement
# if index is None:
#     raise RuntimeError(
#         f"Model {self.__class__.__name__} does not allow locking.")
Another question for @mnoergaard.
Force-pushed 90eb886 to afb11da.
The file
Force-pushed afb11da to cbf417d.
Re #204 (comment): Compared to afb11da, in cbf417d I am discarding the idea of keeping a parallel of the dMRI base model in the PET base model. The PET model tests pass and the notebook runs now. It gives a result that is almost the same as in #203 (comment), with only minor differences in X and Y rotation starts/ends. However, the parallelization was lost in the refactoring and the notebook now takes almost 2 hours compared to 30 minutes, so I need to look into that.
Force-pushed cbf417d to 10af03f.
Codecov Report
❌ Patch coverage is

Additional details and impacted files (coverage diff):

| | main | #204 | +/- |
| --- | --- | --- | --- |
| Coverage | 75.95% | 75.78% | -0.17% |
| Files | 24 | 24 | |
| Lines | 1439 | 1458 | +19 |
| Branches | 166 | 168 | +2 |
| Hits | 1093 | 1105 | +12 |
| Misses | 275 | 280 | +5 |
| Partials | 71 | 73 | +2 |

View full report in Codecov by Sentry.
Force-pushed 6318c50 to 908faa8.
Re #204 (comment): the throttle issue was due to this line (nifreeze/src/nifreeze/model/pet.py, line 199 in 6318c50), which I had borrowed from nifreeze/src/nifreeze/model/dmri.py (line 107 in cb13535). It has now been changed (nifreeze/src/nifreeze/model/pet.py, line 199 in 908faa8) to match nifreeze/src/nifreeze/model/dmri.py (line 107 in cb13535) more closely, and the speed is back. The results in the plots are, as I was seeing yesterday (cf. the above comment), almost the same, with minor differences:
Force-pushed 4751545 to 19d5956.
# Calculate index coordinates in the B-Spline grid
self._n_ctrl = n_ctrl or (len(timepoints) // 4) + 1

def _preproces_data(self) -> np.ndarray:
    # ToDo
What is the todo here?
If I recall correctly, the idea there is to preprocess only the data that is required; e.g., if we are using only a subset of the volumes, then we shouldn't preprocess the entire dataset. See, for example, nifreeze/src/nifreeze/model/dmri.py, lines 122 to 124 in 4cfaadb:
data, _, gtab = self._dataset[idxmask]
# Select voxels within mask or just unravel 3D if no mask
data = data[brainmask, ...] if brainmask is not None else data.reshape(-1, data.shape[-1])
The `data, _, gtab = self._dataset[idxmask]` line is commented out here, immediately after the ToDo, as a pointer to that idea.
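A rough PET analogue of that subsetting, as a standalone sketch operating on a plain NumPy 4D array; the function name and signature are illustrative, not the nifreeze API:

```python
import numpy as np


def _preprocess_data_sketch(data4d, idxmask=None, brainmask=None):
    # Hypothetical sketch: `data4d` is the 4D PET array; keep only the
    # requested volumes before any further preprocessing.
    data = data4d[..., idxmask] if idxmask is not None else data4d
    # Select voxels within mask or just unravel 3D if no mask.
    return data[brainmask] if brainmask is not None else data.reshape(-1, data.shape[-1])
```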
# Convert data into V (voxels) x T (timepoints)
data = data.reshape((-1, data.shape[-1])) if brainmask is None else data[brainmask]
# ToDo
Allowing `_fit` to override `timepoints` through kwargs duplicates information that was already validated and stored on `self._x` during initialization. Dropping that extra kwarg simplifies the API and avoids inconsistent inputs.
Not sure I understand: maybe the reply is answering a question elsewhere?
data = data.reshape((-1, data.shape[-1])) if brainmask is None else data[brainmask]
# ToDo
# Does not make sense to make timepoints be a kwarg if it is provided as a named parameter to __init__
timepoints = kwargs.get("timepoints", None) or self._x
This should be removed.
# ToDo
# Does not make sense to make timepoints be a kwarg if it is provided as a named parameter to __init__
timepoints = kwargs.get("timepoints", None) or self._x
x = np.asarray(timepoints, dtype="float32") / self._xlim * self._n_ctrl
This can be replaced with
x = self._x / self._xlim * self._n_ctrl
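For clarity, a small sketch contrasting the two variants; the method names are illustrative, while the expressions themselves are the ones quoted in the hunks above:

```python
import numpy as np


def _normalize_with_kwarg(self, **kwargs):
    # Current behavior: a `timepoints` kwarg can override the value that was
    # already validated and stored on self._x at initialization.
    timepoints = kwargs.get("timepoints", None) or self._x
    return np.asarray(timepoints, dtype="float32") / self._xlim * self._n_ctrl


def _normalize_from_attribute(self):
    # Suggested behavior: rely solely on the timings stored at __init__.
    return self._x / self._xlim * self._n_ctrl
```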
mnoergaard left a comment:
Hi @jhlegarreta. Thanks for pushing this forward. I have added some comments. And just to be clear on the notation: `midframe` is where the dataset stores the real-world frame timing; `timepoints`/`_x` is the copy of those timings handed to the model; `_xlim` is the acquisition length used to scale those timings; and `x` is just the normalized coordinate derived from `timepoints` when solving or evaluating the B-Spline. Do you agree?
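A tiny numeric illustration of that notation; the numbers are made up, and only the relationships between the quantities follow the description above (the `_n_ctrl` default comes from the diff hunk earlier in this review):

```python
import numpy as np

midframe = np.array([30.0, 90.0, 150.0, 270.0])  # real-world frame midpoints (s)
total_duration = 300.0                           # acquisition length (s)

_x = np.asarray(midframe, dtype="float32")       # timings handed to the model
_xlim = total_duration                           # scale used to normalize timings
_n_ctrl = len(_x) // 4 + 1                       # default B-Spline control points

x = _x / _xlim * _n_ctrl                         # normalized B-Spline coordinate
print(x)                                         # [0.2 0.6 1.  1.8]
```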
Force-pushed 56cf1b8 to a4643c5.
Force-pushed a4643c5 to 21026bb.
Refactor PET model: use a base class that contains the essential
properties for a PET model and create a derived class that implements
the B-Spline correction.
- Make the `x_lim` private attribute an array when assigning the corresponding value at initialization, to enable NumPy broadcasting.
- Remove the unnecessary explicit `float` casting around the number of
control points: the `dtype="float32"` specifier creates a float array.
Fixes:
```
Expected type 'str | Buffer | SupportsFloat | SupportsIndex', got 'Literal[0] | None | {__eq__} | int' instead
```
raised by the IDE.
- Use `np.asarray` to avoid making extra copies when they are not necessary.
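A rough structural sketch of the refactor described here; names and method bodies not quoted in the diff hunks above are illustrative assumptions, not the actual nifreeze code:

```python
import numpy as np


class PETBaseModel:
    """Base class with the essential properties of a PET model (sketch)."""

    def __init__(self, dataset, timepoints, xlim, n_ctrl=None):
        self._dataset = dataset
        self._x = np.asarray(timepoints, dtype="float32")
        # Assigned as an array to enable NumPy broadcasting.
        self._xlim = np.asarray(xlim, dtype="float32")
        # Calculate index coordinates in the B-Spline grid.
        self._n_ctrl = n_ctrl or (len(timepoints) // 4) + 1


class BSplinePETModelSketch(PETBaseModel):
    """Derived class implementing the B-Spline correction (sketch)."""

    def _fit(self, data):
        # Normalized coordinates in the B-Spline grid.
        x = self._x / self._xlim * self._n_ctrl
        ...  # solve for the B-Spline coefficients per voxel (omitted)
```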
Force-pushed 21026bb to 7d22d9e.

